A second-order method for convex l1-regularized optimization with active-set prediction
Abstract
We describe an active-set method for the minimization of an objective function φ that is the sum of a smooth convex function and an l1-regularization term. A distinctive feature of the method is the way in which active-set identification and second-order subspace minimization steps are integrated to combine the predictive power of the two approaches. At every iteration, the algorithm selects a candidate set of free and fixed variables, performs an (inexact) subspace phase, and then assesses the quality of the new active set. If it is not judged to be acceptable, then the set of free variables is restricted and a new active-set prediction is made. We establish global convergence for our approach, and compare the new method against the state-of-the-art code LIBLINEAR.
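To make the two-phase structure concrete, here is a minimal numpy sketch of one iteration in this spirit. The ISTA-style prediction rule, the trial step size alpha, and the crude orthant-based acceptance test are illustrative assumptions, not the paper's actual algorithm.

```python
import numpy as np

def soft_threshold(z, t):
    # Proximal operator of t * ||.||_1: shrink each entry toward zero by t.
    return np.sign(z) * np.maximum(np.abs(z) - t, 0.0)

def active_set_step(x, grad, hess, lam, alpha=1.0):
    """One illustrative iteration: an ISTA-style trial point predicts which
    variables are free vs. fixed at zero, then a Newton step is taken on the
    free subspace with the l1 term linearized using the predicted signs."""
    # Phase 1: active-set prediction from a proximal-gradient trial point.
    trial = soft_threshold(x - alpha * grad, alpha * lam)
    free = trial != 0.0                      # predicted support
    x_new = np.zeros_like(x)                 # fixed variables stay at zero
    if free.any():
        signs = np.sign(trial[free])         # predicted orthant
        # Phase 2: subspace Newton step, solving
        #   H_FF d = -(grad_F + lam * signs)
        H_ff = hess[np.ix_(free, free)]
        d = np.linalg.solve(H_ff, -(grad[free] + lam * signs))
        x_new[free] = x[free] + d
        # Crude quality check: entries that left the predicted orthant are
        # judged wrongly freed and are fixed back at zero.
        x_new[free] = np.where(np.sign(x_new[free]) == signs,
                               x_new[free], 0.0)
    return x_new
```

For a strongly convex quadratic f(x) = 0.5 xᵀAx − bᵀx, one would call this with grad = A @ x - b and hess = A and iterate; once the predicted support stops changing, the step reduces to Newton's method on that subspace.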
Similar Papers
Proximal Quasi-Newton for Computationally Intensive L1-regularized M-estimators
We consider the class of optimization problems arising from computationally intensive ℓ1-regularized M-estimators, where the function or gradient values are very expensive to compute. A particular instance of interest is the ℓ1-regularized MLE for learning Conditional Random Fields (CRFs), which are a popular class of statistical models for varied structured prediction problems such as sequence...
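For context, the generic proximal quasi-Newton iteration the title refers to solves a model subproblem of roughly the following form (a standard sketch, not necessarily this paper's exact scheme; B_k is a quasi-Newton approximation to the Hessian built from the expensive gradients):

\[
d_k = \operatorname*{arg\,min}_{d}\ \nabla f(x_k)^\top d + \tfrac12\, d^\top B_k d + \lambda \lVert x_k + d \rVert_1,
\qquad x_{k+1} = x_k + \alpha_k d_k .
\]

Each expensive gradient evaluation is thus amortized over a cheaper inner solve of the nonsmooth subproblem.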
A family of second-order methods for convex ℓ1-regularized optimization
This paper is concerned with the minimization of an objective that is the sum of a convex function f and an ℓ1 regularization term. Our interest is in methods that incorporate second-order information about the function f to accelerate convergence. We describe a semi-smooth Newton framework that can be used to generate a variety of second-order methods, including block active-set methods, orthant-based...
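One common way to build such a framework (a generic formulation assumed here, not necessarily the paper's) is to recast the optimality condition as a semi-smooth root-finding problem via the proximal operator of the ℓ1 term:

\[
F(x) \;=\; x - \operatorname{prox}_{\lambda\lVert\cdot\rVert_1}\bigl(x - \nabla f(x)\bigr) \;=\; 0,
\]

where the prox is componentwise soft-thresholding; applying a generalized (semi-smooth) Newton iteration to F, with different choices of generalized Jacobian, recovers different second-order methods.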
An Interior-Point Method for Large-Scale l1-Regularized Logistic Regression
Logistic regression with l1 regularization has been proposed as a promising method for feature selection in classification problems. Several specialized solution methods have been proposed for l1-regularized logistic regression problems (LRPs). However, existing methods do not scale well to large problems that arise in many practical settings. In this paper we describe an efficient interior-point...
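For reference, the l1-regularized logistic regression problem is conventionally written as follows (notation assumed here: m training pairs (x_i, y_i) with y_i ∈ {−1, +1}, weight vector w, intercept v, regularization weight λ > 0):

\[
\min_{w,\,v}\ \frac{1}{m}\sum_{i=1}^{m}\log\bigl(1 + \exp(-y_i(w^\top x_i + v))\bigr) + \lambda\lVert w\rVert_1 .
\]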
An Efficient Method for Large-Scale l1-Regularized Convex Loss Minimization
Convex loss minimization with l1 regularization has been proposed as a promising method for feature selection in classification (e.g., l1-regularized logistic regression) and regression (e.g., l1-regularized least squares). In this paper we describe an efficient interior-point method for solving large-scale l1-regularized convex loss minimization problems that uses a preconditioned conjugate gradient...
Exact Regularization of Convex Programs
The regularization of a convex program is exact if all solutions of the regularized problem are also solutions of the original problem for all values of the regularization parameter below some positive threshold. For a general convex program, we show that the regularization is exact if and only if a certain selection problem has a Lagrange multiplier. Moreover, the regularization parameter threshold...
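To fix ideas, the setup can be sketched as follows (local notation assumed here: f convex, ρ the regularizer, C the feasible set, p* the optimal value of the original problem):

\[
\text{(P)}\ \min_{x\in C} f(x), \qquad
\text{(P}_\delta)\ \min_{x\in C} f(x) + \delta\,\rho(x), \qquad
\text{(S)}\ \min_{x\in C} \rho(x)\ \ \text{s.t.}\ \ f(x)\le p^{*}.
\]

In this language, the abstract's statement is that exactness of (P_δ) for all sufficiently small δ > 0 is equivalent to the selection problem (S) admitting a Lagrange multiplier for the constraint f(x) ≤ p*.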
Journal: Optimization Methods and Software
Volume: 31
Issue: -
Pages: -
Year: 2016